Conversation

kooshi (Contributor) commented on Aug 8, 2025:

This PR adds three arguments (an example invocation is sketched after the list):

  • --num-experts to make it easy to override the number of MoE experts used
  • --omit-experts to mask out the given expert indices so they are never selected
  • --force-experts to always route through the given experts
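
An example invocation might look like the following. This is only a sketch: the model path and prompt are placeholders, and the comma-separated index syntax for the list-valued flags is an assumption, not something the PR description confirms.

```sh
# Hypothetical usage of the new flags; the value syntax is assumed.
./llama-cli -m gpt-oss-20b.gguf -p "Hello" -n 64 \
  --num-experts 2 \
  --omit-experts 7,13 \
  --force-experts 0,1
```

The idea here would be to route each token through 2 experts, never select experts 7 and 13, and always include experts 0 and 1.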

I added this out of curiosity: I wanted to see whether gpt-oss's censorship was confined to a few experts, but it appears to be dispersed across all of them.
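
For anyone curious how such a probe could be scripted, a per-expert ablation sweep might look roughly like this. Again a sketch, not code from the PR: the expert count of 32, the prompt, and the single-index flag syntax are all assumptions.

```sh
# Hypothetical sweep: mask one expert at a time and compare outputs.
# 32 experts is an assumed model property; adjust to the real count.
for e in $(seq 0 31); do
  ./llama-cli -m gpt-oss-20b.gguf -p "test prompt" -n 128 \
    --omit-experts $e > out_expert_${e}.txt
done
```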

Messing around with the expert selection yields mildly interesting but invariably degraded outputs.

Regardless, I figured I'd see if you felt it was worth adding for the sake of experimentation. If so, I'd be happy to finish cleaning it up for the merge.

ggerganov added the demo label (Demonstrate some concept or idea, not intended to be merged) on Aug 11, 2025
ggerganov (Member) commented:

Thanks for the experiment, but the feature is not suitable for general usage. Marked the PR as a PoC, indicating it's not intended to be merged.
